Consistency of ℓ1-regularized maximum-likelihood for compressive Poisson regression
Abstract
We consider Poisson regression with the canonical link function. This regression model is widely used in regression analysis involving count data; one important application in electrical engineering is transmission tomography. In this paper, we establish the variable selection consistency and estimation consistency of the ℓ1-regularized maximum-likelihood estimator in this regression model, and characterize the asymptotic sample complexity that ensures consistency even under the compressive sensing setting (or the n ≪ p setting in high-dimensional statistics).
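To make the estimator concrete, here is a minimal sketch of ℓ1-penalized maximum likelihood for Poisson regression with the canonical (log) link, fitted by proximal gradient descent with soft-thresholding on simulated n < p data. The problem sizes, the penalty level `lam`, and the step-size rule are illustrative assumptions, not the paper's choices; the paper's result is that, for n large enough relative to the sparsity and log p, the estimated support and coefficients are consistent.

```python
# Sketch only: l1-penalized Poisson maximum likelihood with the canonical log link,
# solved by proximal gradient (ISTA) with soft-thresholding on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n, p, s = 200, 400, 5                         # compressive regime: n < p, s-sparse truth
beta_true = np.zeros(p)
beta_true[:s] = 0.5
X = rng.normal(size=(n, p))
y = rng.poisson(np.exp(X @ beta_true))        # canonical link: E[y | x] = exp(x' beta)

lam = np.sqrt(np.log(p) / n)                  # order-of-magnitude penalty level (assumption)
beta = np.zeros(p)
op_norm_sq = np.linalg.norm(X, 2) ** 2
for _ in range(1000):
    mu = np.exp(X @ beta)
    grad = X.T @ (mu - y) / n                 # gradient of the average negative log-likelihood
    step = n / (2.0 * mu.max() * op_norm_sq)  # conservative step from a local curvature bound
    z = beta - step * grad
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft-thresholding prox

print("true support:     ", np.flatnonzero(beta_true))
print("estimated support:", np.flatnonzero(np.abs(beta) > 1e-8))
```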
Similar resources
Jackknifed Liu-type Estimator in Poisson Regression Model
The Liu estimator has consistently been demonstrated to be an attractive shrinkage method for reducing the effects of multicollinearity. The Poisson regression model is a well-known model in applications when the response variable consists of count data. However, it is known that multicollinearity negatively affects the variance of the maximum likelihood estimator (MLE) of the Poisson regressio...
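As a rough illustration of the Liu-type idea for Poisson regression, the sketch below applies one common shrinkage form, β̂_d = (X'ŴX + I)⁻¹(X'ŴX + dI)β̂_MLE with Ŵ = diag(exp(x_i'β̂_MLE)), to a deliberately collinear design. The formula, the collinear design, and the choice d = 0.8 are assumptions drawn from the ridge/Liu literature, not the authors' exact (jackknifed) estimator, and the jackknifing step is not shown.

```python
# Sketch of a Liu-type shrinkage of the Poisson MLE under multicollinearity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, p = 200, 4
common = rng.normal(size=(n, 1))
X = common + 0.3 * rng.normal(size=(n, p))    # strongly correlated columns
beta_true = np.array([0.4, -0.3, 0.2, 0.1])
y = rng.poisson(np.exp(X @ beta_true))

beta_mle = sm.GLM(y, X, family=sm.families.Poisson()).fit().params
W = np.diag(np.exp(X @ beta_mle))             # estimated GLM weights, diag(mu_hat)
S = X.T @ W @ X                               # Fisher information (up to scaling)
d = 0.8                                       # Liu biasing parameter in (0, 1), an assumption
beta_liu = np.linalg.solve(S + np.eye(p), (S + d * np.eye(p)) @ beta_mle)

print("MLE:     ", np.round(beta_mle, 3))
print("Liu-type:", np.round(beta_liu, 3))
```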
Consistency and asymptotic normality of the maximum likelihood estimator in a zero-inflated generalized Poisson regression
Poisson regression models for count variables have been utilized in many applications. However, in many problems overdispersion and zero-inflation occur. In this paper we study regression models based on the generalized Poisson distribution (Consul (1989)). These regression models, which have been used for about 15 years, do not belong to the class of generalized linear models considered by McCull...
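The following sketch simulates overdispersed, zero-inflated counts and fits a zero-inflated generalized Poisson regression. It assumes a recent statsmodels release that ships `ZeroInflatedGeneralizedPoisson`; the simulated data-generating process and link choices are assumptions made only to illustrate the model class.

```python
# Sketch: simulate zero-inflated, overdispersed counts and fit a ZIGP regression.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedGeneralizedPoisson

rng = np.random.default_rng(2)
n = 1000
x = rng.normal(size=n)
X = sm.add_constant(x)                         # count-model design: intercept + covariate
mu = np.exp(0.5 + 0.8 * x)                     # count-part mean (log link)
pi = 1.0 / (1.0 + np.exp(-(-1.0 + 0.5 * x)))   # structural-zero probability (logit link)
y = np.where(rng.uniform(size=n) < pi, 0,
             rng.negative_binomial(2.0, 2.0 / (2.0 + mu)))   # overdispersed count part

model = ZeroInflatedGeneralizedPoisson(y, X, exog_infl=X)
result = model.fit(maxiter=500, disp=False)
print(result.params)                           # 'inflate_*', count coefficients, dispersion
```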
Hurdle, Inflated Poisson and Inflated Negative Binomial Regression Models for Analysis of Count Data with Extra Zeros
In this paper, we propose Hurdle regression models for analysing count responses with extra zeros. Maximum likelihood is used to estimate the model parameters. The application of the proposed models is illustrated on an insurance dataset; in this example, many of the claim counts are equal to zero, which clarifies the use of a model with a zero-inflat...
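A hurdle model has two parts: a binary model for whether the count clears zero, and a zero-truncated count model for the positive observations. The sketch below is one minimal version of that idea (a logit hurdle plus a zero-truncated Poisson likelihood maximised with scipy); it is not the paper's exact specification, and the simulated data and link choices are assumptions.

```python
# Two-part hurdle sketch: logit for y > 0, zero-truncated Poisson for positive counts.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)
X = sm.add_constant(x)
p_pos = 1.0 / (1.0 + np.exp(-(0.3 + 0.7 * x)))   # P(y > 0): the "hurdle" part
mu = np.exp(0.4 + 0.5 * x)                        # mean of the untruncated Poisson part
pos = rng.uniform(size=n) < p_pos
y = np.zeros(n, dtype=int)
while True:                                       # rejection-sample zero-truncated counts
    idx = pos & (y == 0)
    if not idx.any():
        break
    y[idx] = rng.poisson(mu[idx])

hurdle_fit = sm.Logit((y > 0).astype(float), X).fit(disp=False)   # part 1: zero vs positive

def trunc_poisson_nll(beta, X, y):
    """Negative log-likelihood of a zero-truncated Poisson with log link."""
    eta = X @ beta
    mu = np.exp(eta)
    return -np.sum(y * eta - mu - gammaln(y + 1) - np.log1p(-np.exp(-mu)))

Xp, yp = X[y > 0], y[y > 0]
count_fit = minimize(trunc_poisson_nll, x0=np.zeros(X.shape[1]), args=(Xp, yp))
print("hurdle (logit) coefficients:      ", np.round(hurdle_fit.params, 3))
print("truncated-Poisson coefficients:   ", np.round(count_fit.x, 3))
```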
Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications
The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including ...
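For intuition, here is a small sketch of a weighted ℓ1-penalized estimator for the simplest convex loss (squared error), solved by proximal gradient with coordinate-wise weighted soft-thresholding, followed by a second, adaptively re-weighted stage that echoes the "multistage adaptive" idea. The sizes, the penalty level, and the re-weighting rule 1/(|β̂_j| + ε) are assumptions, not the article's exact procedure.

```python
# Weighted l1-penalized least squares via ISTA, then an adaptively re-weighted stage.
import numpy as np

def weighted_lasso(X, y, lam, w, n_iter=1000):
    """Minimise (1/2n)||y - X b||^2 + lam * sum_j w_j |b_j| by proximal gradient."""
    n, p = X.shape
    beta = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2          # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = beta - step * X.T @ (X @ beta - y) / n
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return beta

rng = np.random.default_rng(4)
n, p, s = 150, 300, 5
beta_true = np.zeros(p); beta_true[:s] = 1.0
X = rng.normal(size=(n, p))
y = X @ beta_true + rng.normal(scale=0.5, size=n)

lam = 0.5 * np.sqrt(np.log(p) / n)
beta1 = weighted_lasso(X, y, lam, np.ones(p))     # stage 1: ordinary (unweighted) Lasso
w2 = 1.0 / (np.abs(beta1) + 1e-2)                 # stage 2: adaptive weights from stage 1
beta2 = weighted_lasso(X, y, lam, w2)
print("stage-1 nonzeros:", np.count_nonzero(beta1),
      " stage-2 nonzeros:", np.count_nonzero(beta2))
```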
A Generic Path Algorithm for Regularized Statistical Estimation.
Regularization is widely used in statistics and machine learning to prevent overfitting and to gear the solution towards prior information. In general, a regularized estimation problem minimizes the sum of a loss function and a penalty term. The penalty term is usually weighted by a tuning parameter and encourages certain constraints on the parameters to be estimated. Particular choices of constraints...
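The notion of a solution path over the tuning parameter can be illustrated with a standard tool: the short sketch below traces a Lasso path on simulated data using scikit-learn's `lasso_path`. This is only a stand-in for the generic path algorithm the abstract describes, and the data and penalty grid are assumptions.

```python
# Illustration of a regularization path: solutions over a decreasing grid of penalties.
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(5)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# lasso_path computes the whole path with warm starts, so it costs little more
# than a single fit at the smallest penalty.
alphas, coefs, _ = lasso_path(X, y, n_alphas=50)
for a, c in zip(alphas[::10], coefs.T[::10]):
    print(f"alpha={a:.3f}  active set size={np.count_nonzero(c)}")
```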
Journal:
Volume, Issue:
Pages: -
Publication date: 2015